
# 16-Expert Dynamic Fusion

Llama 4 Scout 17B 4E Instruct
Llama 4 Scout is a multimodal Mixture-of-Experts (MoE) model from Meta with 17 billion active parameters. It supports 12 languages and image understanding, and it fuses expert outputs dynamically with top-k = 4 routing.
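The top-k expert fusion described above can be sketched in a few lines: a gating network scores all experts per token, only the k highest-scoring experts run, and their outputs are combined with the softmax-normalized gate scores. This is a minimal illustrative sketch with made-up shapes and random weights, not Meta's implementation.

```python
import numpy as np

def topk_moe(x, expert_weights, gate_weights, k=4):
    """Route a token vector x to its k best experts and fuse their outputs.

    expert_weights: (num_experts, d, d) per-expert projection matrices (hypothetical)
    gate_weights:   (d, num_experts) gating projection (hypothetical)
    """
    logits = x @ gate_weights                    # one score per expert
    chosen = np.argsort(logits)[-k:]             # indices of the k top experts
    scores = np.exp(logits[chosen] - logits[chosen].max())
    scores /= scores.sum()                       # softmax over the selected experts only
    outputs = np.stack([x @ expert_weights[i] for i in chosen])
    return scores @ outputs                      # gate-weighted fusion of k expert outputs

rng = np.random.default_rng(0)
d = 8
x = rng.standard_normal(d)
experts = rng.standard_normal((16, d, d))        # 16 experts, matching the page title
gates = rng.standard_normal((d, 16))
y = topk_moe(x, experts, gates, k=4)
print(y.shape)
```

Because only k of the 16 experts execute per token, compute scales with the active-parameter count rather than the full parameter count, which is why a model like this is described by its 17B active parameters.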
Tags: Large Language Model · Transformers · Multilingual
Publisher: shadowlilac
© 2025 AIbase